In mathematics, Sylvester's criterion is a necessary and sufficient criterion to determine whether a Hermitian matrix is positive-definite. It is named after James Joseph Sylvester. Sylvester's criterion states that a Hermitian matrix ''M'' is positive-definite if and only if all of the following matrices have a positive determinant:
* the upper left 1-by-1 corner of ''M'',
* the upper left 2-by-2 corner of ''M'',
* the upper left 3-by-3 corner of ''M'',
* ...
* ''M'' itself.
In other words, all of the leading principal minors must be positive. An analogous theorem holds for characterizing positive-semidefinite Hermitian matrices, except that it is no longer sufficient to consider only the ''leading'' principal minors: a Hermitian matrix ''M'' is positive-semidefinite if and only if all principal minors of ''M'' are nonnegative.

== Proof ==

The proof below is given only for a nonsingular Hermitian matrix with real entries, hence only for nonsingular real-symmetric matrices.

Positive definite or semidefinite matrix: A symmetric matrix ''A'' whose eigenvalues are positive (''λ'' > 0) is called positive definite; when the eigenvalues are merely nonnegative (''λ'' ≥ 0), ''A'' is said to be positive semidefinite.

Theorem I: A real-symmetric matrix ''A'' has nonnegative eigenvalues if and only if ''A'' can be factored as ''A'' = ''B''<sup>T</sup>''B'', and all eigenvalues are positive if and only if ''B'' is nonsingular.

Proof: If ''A'' = ''B''<sup>T</sup>''B'', then for any eigenpair (''λ'', ''x'') of ''A'',
:<math>\lambda = \frac{x^T A x}{x^T x} = \frac{x^T B^T B x}{x^T x} = \frac{\|Bx\|^2}{\|x\|^2} \ge 0,</math>
and the inequality is strict whenever ''B'' is nonsingular, since then ''Bx'' ≠ 0 for ''x'' ≠ 0. Conversely, if ''A'' is symmetric with nonnegative eigenvalues, the spectral theorem gives ''A'' = ''P''Λ''P''<sup>T</sup> with ''P'' orthogonal and Λ = diag(''λ''<sub>1</sub>, ..., ''λ''<sub>''n''</sub>), so ''B'' = Λ<sup>1/2</sup>''P''<sup>T</sup> satisfies ''A'' = ''B''<sup>T</sup>''B'', and ''B'' is nonsingular exactly when all eigenvalues are positive.

Theorem II (The Cholesky decomposition): The symmetric matrix ''A'' possesses positive pivots if and only if ''A'' can be uniquely factored as ''A'' = ''R''<sup>T</sup>''R'', where ''R'' is an upper-triangular matrix with positive diagonal entries. This is known as the Cholesky decomposition of ''A'', and ''R'' is called the Cholesky factor of ''A''.

Proof: If the symmetric matrix ''A'' possesses positive pivots, then ''A'' has an ''LU'' factorization that may be rewritten as an ''LDU'' factorization:
:<math>
\begin{align}
A = LU' & = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ \ell_{21} & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ \ell_{n1} & \ell_{n2} & \cdots & 1 \end{pmatrix}
\begin{pmatrix} u_{11} & u_{12} & \cdots & u_{1n} \\ 0 & u_{22} & \cdots & u_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & u_{nn} \end{pmatrix} \\
& = LDU = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ \ell_{21} & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ \ell_{n1} & \ell_{n2} & \cdots & 1 \end{pmatrix}
\begin{pmatrix} u_{11} & 0 & \cdots & 0 \\ 0 & u_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & u_{nn} \end{pmatrix}
\begin{pmatrix} 1 & u_{12}/u_{11} & \cdots & u_{1n}/u_{11} \\ 0 & 1 & \cdots & u_{2n}/u_{22} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix},
\end{align}
</math>
where the pivots ''u''<sub>11</sub>, ..., ''u''<sub>''nn''</sub> are positive. By a uniqueness property of the ''LDU'' decomposition, the symmetry of ''A'' yields ''U'' = ''L''<sup>T</sup>, consequently ''A'' = ''LDU'' = ''LDL''<sup>T</sup>. Setting ''R'' = ''D''<sup>1/2</sup>''L''<sup>T</sup>, where ''D''<sup>1/2</sup> = diag(√''u''<sub>11</sub>, ..., √''u''<sub>''nn''</sub>), yields ''A'' = ''R''<sup>T</sup>''R'' with ''R'' upper triangular and with positive diagonal entries.

Conversely, suppose ''A'' = ''R''<sup>T</sup>''R'', where ''R'' is upper triangular with positive diagonal entries ''r''<sub>11</sub>, ..., ''r''<sub>''nn''</sub>. Write ''R''<sup>T</sup> = ''LD'', where ''L'' is a lower triangular matrix with a unit diagonal and ''D'' is the diagonal matrix whose diagonal entries are the ''r''<sub>''ii''</sub>'s. Then ''D'' has a positive diagonal, hence ''D'' is nonsingular, and hence ''D''<sup>2</sup> is a nonsingular diagonal matrix. Also, ''L''<sup>T</sup> is an upper triangular matrix with a unit diagonal. Consequently, ''A'' = ''LD''<sup>2</sup>''L''<sup>T</sup> is the ''LDU'' factorization of ''A'', and thus the pivots must be positive because they are the diagonal entries of ''D''<sup>2</sup>.

Uniqueness of the Cholesky decomposition: If we have another Cholesky decomposition ''A'' = ''R''<sub>1</sub><sup>T</sup>''R''<sub>1</sub> of ''A'', where ''R''<sub>1</sub> is upper triangular with a positive diagonal, then, as above, we may write ''R''<sub>1</sub><sup>T</sup> = ''L''<sub>1</sub>''D''<sub>1</sub>, where ''L''<sub>1</sub> is a lower triangular matrix with a unit diagonal and ''D''<sub>1</sub> is a diagonal matrix whose diagonal entries are the same as the corresponding diagonal entries of ''R''<sub>1</sub>. Consequently, ''A'' = ''L''<sub>1</sub>''D''<sub>1</sub><sup>2</sup>''L''<sub>1</sub><sup>T</sup> is an ''LDU'' factorization of ''A''. By the uniqueness of the ''LDU'' factorization of ''A'', we have ''L''<sub>1</sub> = ''L'' and ''D''<sub>1</sub><sup>2</sup> = ''D''<sup>2</sup>. As both ''D''<sub>1</sub> and ''D'' are diagonal matrices with positive diagonal entries, we have ''D''<sub>1</sub> = ''D''. Hence ''R''<sub>1</sub> = ''D''<sub>1</sub>''L''<sub>1</sub><sup>T</sup> = ''DL''<sup>T</sup> = ''R''. Hence ''A'' has a unique Cholesky decomposition.
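The factorization in Theorem II can be checked numerically. The following NumPy sketch is an added illustration rather than part of the original proof; the matrix ''A'' below is an arbitrary positive-definite example chosen here. It computes a Cholesky factor and confirms that ''A'' = ''R''<sup>T</sup>''R'' with ''R'' upper triangular and with positive diagonal entries.

<syntaxhighlight lang="python">
import numpy as np

# Arbitrary symmetric positive-definite example matrix (illustration only).
A = np.array([[4.0, 2.0, 1.0],
              [2.0, 3.0, 0.5],
              [1.0, 0.5, 2.0]])

# np.linalg.cholesky returns a lower-triangular L with A = L @ L.T;
# its transpose is the upper-triangular Cholesky factor R of Theorem II.
L = np.linalg.cholesky(A)
R = L.T

print(np.allclose(A, R.T @ R))   # True: A = R^T R
print(np.all(np.diag(R) > 0))    # True: R has positive diagonal entries
</syntaxhighlight>

For a symmetric matrix that is not positive definite, <code>np.linalg.cholesky</code> raises <code>LinAlgError</code>, reflecting the fact that no such factor with positive diagonal exists.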
Theorem III: Let ''A''<sub>''k''</sub> be the ''k'' × ''k'' leading principal submatrix of ''A''<sub>''n''×''n''</sub>. If ''A'' has an ''LU'' factorization ''A'' = ''LU'', where ''L'' is a lower triangular matrix with a unit diagonal, then det(''A''<sub>''k''</sub>) = ''u''<sub>11</sub>''u''<sub>22</sub> ··· ''u''<sub>''kk''</sub>, and the ''k''-th pivot is ''u''<sub>''kk''</sub> = det(''A''<sub>1</sub>) = ''a''<sub>11</sub> for ''k'' = 1 and ''u''<sub>''kk''</sub> = det(''A''<sub>''k''</sub>)/det(''A''<sub>''k''−1</sub>) for ''k'' = 2, 3, ..., ''n'', where ''u''<sub>''jj''</sub> is the (''j'', ''j'')-th entry of ''U'' for all ''j'' = 1, 2, ..., ''n''.

Combining Theorem II with Theorem III yields:

Statement I: If the symmetric matrix ''A'' can be factored as ''A'' = ''R''<sup>T</sup>''R'', where ''R'' is an upper-triangular matrix with positive diagonal entries, then all the pivots of ''A'' are positive (by Theorem II), and therefore all the leading principal minors of ''A'' are positive (by Theorem III).

Statement II: If the nonsingular ''n'' × ''n'' symmetric matrix ''A'' can be factored as ''A'' = ''B''<sup>T</sup>''B'', then the QR decomposition (closely related to the Gram–Schmidt process) of ''B'' (''B'' = ''QR'') yields ''A'' = ''B''<sup>T</sup>''B'' = (''QR'')<sup>T</sup>''QR'' = ''R''<sup>T</sup>''Q''<sup>T</sup>''QR'' = ''R''<sup>T</sup>''R'', where ''Q'' is an orthogonal matrix and ''R'' is an upper triangular matrix. As ''A'' is nonsingular and ''A'' = ''R''<sup>T</sup>''R'', it follows that all the diagonal entries of ''R'' are non-zero. Let ''r''<sub>''jj''</sub> be the (''j'', ''j'')-th entry of ''R'' for all ''j'' = 1, 2, ..., ''n''; then ''r''<sub>''jj''</sub> ≠ 0 for all ''j''. Let ''F'' be the diagonal matrix whose (''j'', ''j'')-th entry ''f''<sub>''jj''</sub> is 1 if ''r''<sub>''jj''</sub> > 0 and −1 if ''r''<sub>''jj''</sub> < 0. Then ''F''<sup>T</sup>''F'' = ''I'', the ''n'' × ''n'' identity matrix. Let ''S'' = ''FR''. Then ''S'' is an upper-triangular matrix with all diagonal entries positive, and ''A'' = ''R''<sup>T</sup>''R'' = ''R''<sup>T</sup>''F''<sup>T</sup>''FR'' = ''S''<sup>T</sup>''S''. Hence ''A'' = ''S''<sup>T</sup>''S'' for some upper-triangular matrix ''S'' with all diagonal entries positive. Note that Statement II requires the non-singularity of the symmetric matrix ''A''.

Combining Theorem I with Statement I and Statement II yields:

Statement III: If the real-symmetric matrix ''A'' is positive definite, then ''A'' possesses a factorization of the form ''A'' = ''B''<sup>T</sup>''B'', where ''B'' is nonsingular (Theorem I); the expression ''A'' = ''B''<sup>T</sup>''B'' implies that ''A'' possesses a factorization of the form ''A'' = ''R''<sup>T</sup>''R'', where ''R'' is an upper-triangular matrix with positive diagonal entries (Statement II); therefore all the leading principal minors of ''A'' are positive (Statement I). In other words, Statement III proves the "only if" part of Sylvester's criterion for nonsingular real-symmetric matrices.

Sylvester's criterion: The real-symmetric matrix ''A'' is positive definite if and only if all the leading principal minors of ''A'' are positive.
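Sylvester's criterion is straightforward to test numerically. The sketch below is an added illustration with arbitrarily chosen matrices, and the helper names are invented for this example. It computes the leading principal minors, applies the criterion, compares the verdict with a direct eigenvalue check, and verifies the pivot formula of Theorem III against the squared diagonal of the Cholesky factor.

<syntaxhighlight lang="python">
import numpy as np

def leading_principal_minors(A):
    """Return [det(A_1), det(A_2), ..., det(A_n)] for a square matrix A."""
    n = A.shape[0]
    return [np.linalg.det(A[:k, :k]) for k in range(1, n + 1)]

def is_positive_definite_sylvester(A):
    """Sylvester's criterion: every leading principal minor is positive."""
    return all(m > 0 for m in leading_principal_minors(A))

# Arbitrary symmetric test matrices (illustration only).
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])   # positive definite
B = np.array([[1.0, 2.0],
              [2.0, 1.0]])           # indefinite: minors are 1 and -3

for M in (A, B):
    # The criterion agrees with the eigenvalue definition of positive definiteness.
    print(is_positive_definite_sylvester(M), np.all(np.linalg.eigvalsh(M) > 0))
    # prints "True True" for A and "False False" for B

# Theorem III: the k-th pivot equals det(A_k)/det(A_{k-1}); for the
# positive-definite A these pivots are the squared diagonal entries of
# its Cholesky factor (the entries of D^2 in the proof of Theorem II).
minors = leading_principal_minors(A)
pivots = [minors[0]] + [minors[k] / minors[k - 1] for k in range(1, len(minors))]
print(np.allclose(pivots, np.diag(np.linalg.cholesky(A)) ** 2))  # True
</syntaxhighlight>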